An Empirical Study of ADMM for Nonconvex Problems

Authors

  • Zheng Xu
  • Soham De
  • Mário A. T. Figueiredo
  • Christoph Studer
  • Tom Goldstein
Abstract

The alternating direction method of multipliers (ADMM) is a common optimization tool for solving constrained and non-differentiable problems. We provide an empirical study of the practical performance of ADMM on several nonconvex applications, including ℓ0-regularized linear regression, ℓ0-regularized image denoising, phase retrieval, and eigenvector computation. Our experiments suggest that ADMM performs well on a broad class of nonconvex problems. Moreover, recently proposed adaptive ADMM methods, which automatically tune penalty parameters as the method runs, can improve algorithm efficiency and solution quality compared to ADMM with a non-tuned penalty.
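As a concrete illustration, here is a minimal sketch of scaled-form ADMM applied to one of the listed applications, ℓ0-regularized least squares, min_x 0.5·||Ax − b||² + λ·||x||₀, using the splitting x = z; the z-update is the hard-thresholding proximal step. The formulation, variable names, and parameter values (lam, rho, iters) are illustrative assumptions, not the paper's own code.

    import numpy as np

    def admm_l0_least_squares(A, b, lam=0.1, rho=1.0, iters=200):
        """Scaled-form ADMM sketch for min_x 0.5*||Ax-b||^2 + lam*||x||_0
        with the splitting x = z. Illustrative; not the paper's implementation."""
        n = A.shape[1]
        x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)   # u is the scaled dual
        AtA, Atb = A.T @ A, A.T @ b
        L = np.linalg.cholesky(AtA + rho * np.eye(n))     # factor reused every iteration
        for _ in range(iters):
            # x-update: smooth least-squares subproblem, solved via the cached factor
            rhs = Atb + rho * (z - u)
            x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
            # z-update: prox of lam*||.||_0 at x + u is hard thresholding
            v = x + u
            z = np.where(v**2 > 2.0 * lam / rho, v, 0.0)
            # dual update on the constraint x - z = 0
            u = u + x - z
        return z

    # Tiny synthetic usage example
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    x_true = np.zeros(20); x_true[:3] = [3.0, -2.0, 1.5]
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = admm_l0_least_squares(A, b)

With a fixed penalty rho, behaviour on nonconvex splittings like this one can be sensitive to the chosen value, which is what the adaptive-penalty ADMM variants mentioned in the abstract aim to improve.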


Similar Articles

Convergence Analysis of ADMM for a Family of Nonconvex Problems

In this paper, we analyze the behavior of the well-known alternating direction method of multipliers (ADMM) for solving a family of nonconvex problems. We focus on the well-known consensus and sharing problems, both of which have wide applications in machine learning. We show that in the presence of a nonconvex objective, the classical ADMM is able to reach the set of stationary soluti...


A Distributed, Asynchronous and Incremental Algorithm for Nonconvex Optimization: An ADMM Based Approach

The alternating direction method of multipliers (ADMM) has been popular for solving many signal processing problems, convex or nonconvex. In this paper, we study an asynchronous implementation of the ADMM for solving a nonconvex nonsmooth optimization problem, whose objective is the sum of a number of component functions. The proposed algorithm allows the problem to be solved in a distributed, ...


Convergence rate bounds for a proximal ADMM with over-relaxation stepsize parameter for solving nonconvex linearly constrained problems

This paper establishes convergence rate bounds for a variant of the proximal alternating direction method of multipliers (ADMM) for solving nonconvex linearly constrained optimization problems. The variant of the proximal ADMM allows the inclusion of an over-relaxation stepsize parameter belonging to the interval (0, 2). To the best of our knowledge, all related papers in the literature only co...


Continuous Relaxation of MAP Inference: A Nonconvex Perspective

In this paper, we study a nonconvex continuous relaxation of MAP inference in discrete Markov random fields (MRFs). We show that for arbitrary MRFs, this relaxation is tight, and a discrete stationary point of it can be easily reached by a simple block coordinate descent algorithm. In addition, we study the resolution of this relaxation using popular gradient methods, and further propose a more...


Global Convergence of ADMM in Nonconvex Nonsmooth Optimization

In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, φ(x_0, ..., x_p, y), subject to coupled linear equality constraints. Our ADMM updates each of the primal variables x_0, ..., x_p, y, followed by updating the dual variable. We separate the variable y from the x_i's as it has a spe...
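To make that update order concrete, the toy sketch below runs one such Gauss-Seidel sweep (x_0, then x_1, then y, then the dual variable) repeatedly on a small convex quadratic instance with the constraint x_0 + x_1 + y = b; the objective, data, and penalty value are illustrative assumptions, chosen only so that every subproblem has a closed-form solution.

    import numpy as np

    # Toy multi-block ADMM: minimize
    #   0.5*||x0 - a0||^2 + 0.5*||x1 - a1||^2 + 0.5*||y - c||^2
    # subject to x0 + x1 + y = b.  Data and rho are illustrative assumptions.
    rng = np.random.default_rng(0)
    n = 5
    a0, a1, c, b = (rng.standard_normal(n) for _ in range(4))
    rho = 1.0
    x0 = np.zeros(n); x1 = np.zeros(n); y = np.zeros(n); dual = np.zeros(n)

    for _ in range(300):
        # x0-update: minimize its term plus the augmented Lagrangian penalty
        x0 = (a0 - dual - rho * (x1 + y - b)) / (1.0 + rho)
        # x1-update uses the freshly updated x0 (Gauss-Seidel order)
        x1 = (a1 - dual - rho * (x0 + y - b)) / (1.0 + rho)
        # y-update, then dual ascent on the constraint residual
        y = (c - dual - rho * (x0 + x1 - b)) / (1.0 + rho)
        dual = dual + rho * (x0 + x1 + y - b)

    print("constraint residual:", np.linalg.norm(x0 + x1 + y - b))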



Journal:
  • CoRR

Volume: abs/1612.03349

Publication date: 2016